Inequalities in information theory
Inequalities play a central role in information theory, and they appear in a number of different contexts.
==Shannon-type inequalities==
Consider a finite collection of finitely (or at most countably) supported random variables on the same probability space. A collection of ''n'' random variables has 2^''n'' − 1 non-empty subsets, and a joint entropy can be defined for each of them. For example, when ''n'' = 2, we may consider the entropies H(X_1), H(X_2), and H(X_1, X_2), and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):
* H(X_1) \ge 0
* H(X_2) \ge 0
* H(X_1) \le H(X_1, X_2)
* H(X_2) \le H(X_1, X_2)
* H(X_1, X_2) \le H(X_1) + H(X_2).
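As an illustrative check (a minimal sketch, not part of the original article), the five inequalities can be verified numerically for any concrete joint distribution of two discrete random variables; the distribution below is a made-up example:

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, index):
    """Marginalize a joint distribution {(x1, x2): p} onto coordinate `index`."""
    out = {}
    for outcome, p in joint.items():
        out[outcome[index]] = out.get(outcome[index], 0.0) + p
    return out

# A made-up correlated joint distribution of (X1, X2), for illustration only.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

h1  = entropy(marginal(joint, 0))  # H(X1)     = 1.0 bit here
h2  = entropy(marginal(joint, 1))  # H(X2)     = 1.0 bit here
h12 = entropy(joint)               # H(X1, X2) ≈ 1.722 bits here

assert h1 >= 0 and h2 >= 0         # entropies are non-negative
assert h1 <= h12 and h2 <= h12     # joint entropy dominates each marginal
assert h12 <= h1 + h2              # subadditivity
```

Any other finite joint distribution could be substituted for `joint`; the assertions hold in every case, which is precisely what the inequalities state.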
In fact, these can all be expressed as special cases of a single inequality involving the conditional mutual information, namely
:I(A;B|C) \ge 0,
where A, B, and C each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables. Inequalities that can be derived from this are known as Shannon-type inequalities. More formally (following the notation of Yeung), define \Gamma^*_n to be the set of all ''constructible'' points in \mathbb R^{2^n-1}, where a point is said to be constructible if and only if there is a joint, discrete distribution of ''n'' random variables such that each coordinate of that point, indexed by a non-empty subset of \{1, 2, \dots, n\}, is equal to the joint entropy of the corresponding subset of the ''n'' random variables. The closure of \Gamma^*_n is denoted \overline{\Gamma^*_n}, and the cone in \mathbb R^{2^n-1} characterized by all Shannon-type inequalities among ''n'' random variables is denoted \Gamma_n. In general
:\Gamma^*_n \subseteq \overline{\Gamma^*_n} \subseteq \Gamma_n.
Software has been developed to automate the task of proving Shannon-type inequalities. Given an inequality, such software is able to determine whether the region it describes contains the cone \Gamma_n, in which case the inequality is verified, since \overline{\Gamma^*_n} \subseteq \Gamma_n.
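For instance (a sketch not in the original text), the five two-variable inequalities above can be recovered from I(A;B|C) \ge 0 by suitable choices of A, B, and C, using the identity I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C):
:I(X_1;X_1) = H(X_1) \ge 0
:I(X_2;X_2|X_1) = H(X_1, X_2) - H(X_1) \ge 0
:I(X_1;X_2) = H(X_1) + H(X_2) - H(X_1, X_2) \ge 0,
with the remaining two obtained by exchanging X_1 and X_2.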

Excerpt source: Wikipedia, the free encyclopedia ("Inequalities in information theory")




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.